Foreclosure Echo: How Abandoned Foreclosures are Re-Entering the Market Through Debt Buyers
It is common knowledge that mortgage defaults increased steadily from 2006 through 2011. In some situations, lenders moved swiftly after default to foreclose on the property; but for other homeowners the foreclosure process began and then stalled or was completely abandoned by the lender. The result of these abandoned foreclosures has been devastating to cities and consumers throughout the country. This article explores what is happening to homeowners caught up in the strange world of bank walkaways as the economy begins to improve and abandoned loans re-enter the market through debt buyers. This second wave of collection activity, an echo of the original foreclosure crisis, could easily throw thousands of consumers back into financial hardship just as the economic recovery begins. Part I of this article explores the evidence of foreclosures started and then stalled or abandoned, and their impact on consumers and communities. In Part II, the real zombie title is introduced through evidence gathered from foreclosures in Indiana. This new form of zombie loan is a mortgage loan that has been foreclosed, but is suddenly and inexplicably un-foreclosed. The effect of zombie loans on homeowners, the judicial system, and communities is also explored. Finally, Part III discusses the increased presence of debt buyers in both the buying of loans and the collection of deficiency judgments, in relation to the overall concern currently being voiced regarding the debt-buying industry. The clever ways banks are managing their foreclosure inventory make clear that the effects of zombie loans must be mitigated in order to avoid a second economic downturn: the foreclosure echo.
The Future of Foreclosure Law in the Wake of the Great Housing Crisis of 2007-2014
As 2014 came to an end, so, perhaps, did the worst foreclosure crisis in U.S. history. On January 15, 2015, RealtyTrac, one of the nation’s leading reporters of housing data, declared the foreclosure crisis had ended. Whether or not that declaration proves true, the aftermath of the crisis will be felt for years to come. During the crisis, it is estimated that more than five million families lost their homes to foreclosure. Federal, state, and local responses to the crisis changed laws and perceptions regarding foreclosure. Despite these changes, we end the crisis much the way we began: with a nationwide foreclosure system mistrusted and disliked by lenders and consumers alike. This paper examines the responses to the crisis in an effort to determine what worked, what did not, and where foreclosure law should go from here. In the end, it is clear that we need a more uniform system, but one that also prioritizes homeownership, or at least home occupancy.
A Tale of Two Data-Intensive Paradigms: Applications, Abstractions, and Architectures
Scientific problems that depend on processing large amounts of data require overcoming challenges in multiple areas: managing large-scale data distribution, co-placement and scheduling of data with compute resources, and storing and transferring large volumes of data. We analyze the ecosystems of the two prominent paradigms for data-intensive applications, hereafter referred to as the high-performance computing and Apache-Hadoop paradigms. We propose a basis, common terminology, and functional factors upon which to analyze the two paradigms. We discuss the concept of "Big Data Ogres" and their facets as a means of understanding and characterizing the most common application workloads found across the two paradigms. We then discuss the salient features of the two paradigms, and compare and contrast the two approaches. Specifically, we examine common implementations and approaches of these paradigms, shed light upon the reasons for their current "architecture", and discuss some typical workloads that utilize them. In spite of the significant software distinctions, we believe there is architectural similarity. We discuss the potential integration of different implementations, across the different levels and components. Our comparison progresses from a fully qualitative examination of the two paradigms to a semi-quantitative methodology. We use a simple and broadly used Ogre (K-means clustering) and characterize its performance on a range of representative platforms, covering several implementations from both paradigms. Our experiments provide insight into the relative strengths of the two paradigms. We propose that the set of Ogres will serve as a benchmark to evaluate the two paradigms along different dimensions.
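The abstract does not show how the K-means Ogre is actually run; purely as an illustrative sketch, a single-platform timing run, assuming scikit-learn as the K-means implementation and a synthetic dataset in place of the paper's real workloads, might look like this:

```python
# Illustrative sketch (not the paper's benchmark): time a K-means "Ogre"
# on one platform using scikit-learn and a synthetic dataset.
import time

from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Synthetic workload; sizes are placeholders, not the paper's configurations.
X, _ = make_blobs(n_samples=100_000, n_features=50, centers=100, random_state=0)

start = time.perf_counter()
km = KMeans(n_clusters=100, n_init=1, max_iter=50, random_state=0).fit(X)
elapsed = time.perf_counter() - start

print(f"inertia={km.inertia_:.3e} iterations={km.n_iter_} time={elapsed:.1f}s")
```

Repeating the same workload against HPC-style (e.g., MPI-based) and Hadoop/Spark implementations is the kind of cross-paradigm comparison the semi-quantitative methodology above refers to.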
Fairness and Privacy in Federated Learning and Their Implications in Healthcare
Many contexts currently exist in which distributed learning is difficult or otherwise constrained by security and communication limitations. One common domain where this is a consideration is healthcare, where data is often governed by data-use regulations such as HIPAA. On the other hand, larger sample sizes and shared data models are necessary to allow models to generalize better, because they expose more variability and help balance underrepresented classes. Federated learning is a type of distributed learning that allows models to be trained in a decentralized manner. This, in turn, addresses data security, privacy, and vulnerability considerations, as the data itself is not shared across the nodes of a given learning network. Three main challenges to federated learning are node data that is not independent and identically distributed (iid), high communication overhead between peer clients, and heterogeneity of clients within a network with respect to dataset bias and size. As the field has grown, the notion of fairness in federated learning has also been introduced through novel implementations. Fairness approaches differ from the standard form of federated learning and also have distinct challenges and considerations for the healthcare domain. This paper endeavors to outline the typical lifecycle of fair federated learning in research as well as provide an updated taxonomy to account for the current state of fairness in implementations. Lastly, this paper provides added insight into the implications and challenges of implementing and supporting fairness in federated learning in the healthcare domain.
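The paper surveys fairness approaches rather than prescribing an algorithm; for orientation only, a minimal sketch of the server-side aggregation step in standard (FedAvg-style) federated learning, with illustrative names and NumPy weight arrays standing in for real client models, might look like this:

```python
# Illustrative sketch of FedAvg-style aggregation: the server averages
# per-layer client weights, weighted by each client's sample count, so raw
# data never leaves the clients. Names and shapes are placeholders, not
# taken from the paper.
from typing import List

import numpy as np


def federated_average(client_weights: List[List[np.ndarray]],
                      client_sizes: List[int]) -> List[np.ndarray]:
    """Weighted average of per-layer weights across clients."""
    total = float(sum(client_sizes))
    num_layers = len(client_weights[0])
    return [
        sum(w[layer] * (n / total) for w, n in zip(client_weights, client_sizes))
        for layer in range(num_layers)
    ]


# Toy example: two clients with one weight matrix each and unequal data sizes.
client_a = [np.ones((2, 2))]   # client with 300 samples
client_b = [np.zeros((2, 2))]  # client with 100 samples
print(federated_average([client_a, client_b], client_sizes=[300, 100]))
# every entry is 0.75: the larger client dominates the average
```

Fairness-aware variants commonly adjust this weighting or the surrounding client selection, which is one reason they differ from the standard form mentioned above.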
Interpreting County Level COVID-19 Infection and Feature Sensitivity using Deep Learning Time Series Models
Interpretable machine learning plays a key role in healthcare because understanding feature importance in deep learning model predictions is challenging. We propose a novel framework that uses deep learning to study feature sensitivity for model predictions. This work combines sensitivity analysis with heterogeneous time-series deep learning model prediction, which corresponds to the interpretation of spatio-temporal features. We forecast county-level COVID-19 infection using the Temporal Fusion Transformer. We then use a sensitivity analysis extending the Morris method to see how sensitive the outputs are to perturbations of our static and dynamic input features. The significance of the work is grounded in a real-world COVID-19 infection prediction with highly non-stationary, fine-grained, and heterogeneous data. 1) Our model can capture the detailed daily changes in temporal and spatial model behavior and achieves high prediction performance compared to a PyTorch baseline. 2) By analyzing the Morris sensitivity indices and attention patterns, we decipher the meaning of feature importance in relation to observed population and dynamic model changes. 3) We have collected 2.5 years of socioeconomic and health features over 3142 US counties, including observed cases and deaths, as well as static features (age distribution, health disparity, and industry) and dynamic features (vaccination, disease spread, transmissible cases, and social distancing). Using the proposed framework, we conduct extensive experiments and show our model can learn complex interactions and perform predictions for daily infection at the county level. Being able to model disease infection with a hybrid prediction and description accuracy measurement using the Morris index at the county level is a central idea that sheds light on individual feature interpretation via sensitivity analysis.
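The abstract does not spell out how the Morris method is extended; as a rough sketch only, a simplified one-at-a-time (radial) variant of Morris elementary effects for a generic black-box predictor, with placeholder names and a toy model, might look like this:

```python
# Rough sketch of a simplified (radial, one-at-a-time) Morris elementary
# effects computation for a black-box model; names, sizes, and the toy
# model are placeholders, not taken from the paper.
import numpy as np


def morris_elementary_effects(model, num_features, num_trajectories=50,
                              delta=0.1, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    effects = np.zeros((num_trajectories, num_features))
    for t in range(num_trajectories):
        x = rng.uniform(0.0, 1.0 - delta, size=num_features)  # base point in the unit cube
        y0 = model(x)
        for i in range(num_features):
            x_step = x.copy()
            x_step[i] += delta                                 # perturb one feature at a time
            effects[t, i] = (model(x_step) - y0) / delta
    mu_star = np.abs(effects).mean(axis=0)                     # mu*: overall importance
    sigma = effects.std(axis=0)                                # sigma: nonlinearity/interaction
    return mu_star, sigma


# Toy model: output depends strongly on feature 0, weakly on feature 1, not on feature 2.
def toy_model(x):
    return 5.0 * x[0] + 0.5 * x[1] ** 2


mu_star, sigma = morris_elementary_effects(toy_model, num_features=3)
print(mu_star)  # feature 0 should dominate; feature 2 should be near zero
```

In the setting described above, the black-box model would correspond to the trained Temporal Fusion Transformer and the inputs to the static and dynamic county-level features.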
World of Viruses: the Frozen Horror